Expected Posterior Prior Distributions for Model Selection
Author
Abstract
Consider the problem of comparing parametric models $M_1, \ldots, M_k$, when at least one of the models has an improper prior $\pi_i^N(\theta_i)$. Using the Bayes factor for comparing among these is not feasible due to arbitrary multiplicative constants in $\pi_i^N(\theta_i)$. In this work we suggest adjusting the initial prior $\pi_i^N$ for each model via
$$\pi_i^*(\theta_i) = \int \pi_i^N(\theta_i \mid y)\, m^*(y)\, dy,$$
where $m^*$ is a suitable predictive measure on (imaginary) training samples $y$. The updated prior, $\pi_i^*$, is called the expected posterior prior under $m^*$. Some properties of this approach include: (1) the resulting Bayes factors depend only on sufficient statistics; (2) the resulting Bayesian inference is coherent and allows for multiple comparisons; (3) in many cases it is possible to find $m^*$ such that, for a sample of minimal size, there is predictive matching for the comparison of model $M_i$ to $M_j$, i.e., the Bayes factor $B_{ij} = 1$; (4) in the case of nested models, where $M_1$ is nested in every other model, choosing $m^*(y)$ to be the marginal of $y$ under $M_1$ is asymptotically equivalent to the arithmetic Intrinsic Bayes Factor (Berger and Pericchi, 1996). The expected posterior prior scheme can be applied to a wide variety of statistical problems. Applications to the selection of linear models and to the default analysis of mixture models can be seen in Pérez (1998).
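To make the construction concrete, here is a minimal Monte Carlo sketch (not from the paper; the models, data, and sample sizes below are made-up illustrations) of an expected posterior prior Bayes factor for the nested case described in (4): $M_1$ is a standard normal, $M_2$ adds an unknown mean with a flat baseline prior, and $m^*(y)$ is the marginal of a single imaginary observation $y$ under $M_1$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy nested comparison (illustrative only; models, data, and sizes are made up):
#   M1: x_i ~ N(0, 1)                       (no free parameter)
#   M2: x_i ~ N(mu, 1), improper baseline prior pi^N(mu) proportional to 1
# For a single imaginary observation y, the baseline posterior under M2 is
#   pi^N(mu | y) = N(mu | y, 1),
# and m*(y) is taken as the marginal of y under M1, i.e. N(y | 0, 1).

x = rng.normal(0.3, 1.0, size=20)      # "observed" data
n, xbar = len(x), x.mean()
ss = ((x - xbar) ** 2).sum()

T = 20_000
y = rng.normal(0.0, 1.0, size=T)       # imaginary training samples y_t ~ m*(y)

def log_m2N_given_y(y_t):
    """log m2^N(x | y_t) = log of  int prod_i N(x_i | mu, 1) * N(mu | y_t, 1) dmu
    (closed form for this normal toy model; vectorized over y_t)."""
    return (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * np.log(n + 1.0)
            - 0.5 * ss
            - 0.5 * n * (xbar - y_t) ** 2 / (n + 1.0))

# Marginal of x under the expected posterior prior for M2:
#   m2*(x) = int m2^N(x | y) m*(y) dy  ~  average over the simulated y_t.
log_m2_star = np.logaddexp.reduce(log_m2N_given_y(y)) - np.log(T)
log_m1 = stats.norm.logpdf(x, 0.0, 1.0).sum()

print("log Bayes factor log B21 ≈", log_m2_star - log_m1)
```

The identity used here, $m_2^*(x) = \int m_2^N(x \mid y)\, m^*(y)\, dy$, follows directly from the definition of $\pi_2^*$ by exchanging the order of integration.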
Similar Resources
Estimation of parameter of proportion in Binomial Distribution Using Adjusted Prior Distribution
Historically, various methods have been suggested for estimating the parameter of the Bernoulli and Binomial distributions. One of the suggested methods is the Bayesian method, which is based on employing a prior distribution. Sound selection of the prior on the parameter space plays a crucial role in reducing the posterior Bayesian estimator's error. At times, large-scale parametric changes on the parameter space bring...
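For concreteness, a minimal conjugate-prior sketch (generic Beta-Binomial updating, not the adjusted prior proposed in that paper; the hyperparameters and counts below are made up):

```python
from scipy import stats

# Bayesian estimation of a Binomial proportion with a conjugate Beta(a, b) prior.
a, b = 2.0, 2.0          # assumed prior hyperparameters (illustrative)
n, x = 50, 18            # trials and successes (made-up data)

posterior = stats.beta(a + x, b + n - x)      # Beta posterior by conjugacy
print("posterior mean:", posterior.mean())    # (a + x) / (a + b + n)
print("95% credible interval:", posterior.interval(0.95))
```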
Power-Expected-Posterior Priors for Variable Selection in Gaussian Linear Models
Imaginary training samples are often used in Bayesian statistics to develop prior distributions, with appealing interpretations, for use in model comparison. Expected-posterior priors are defined via imaginary training samples coming from a common underlying predictive distribution m, using an initial baseline prior distribution. These priors can have subjective and also default Bayesian implem...
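For reference, the expected-posterior prior being extended here has the same form as in the abstract above, now indexed by model $M_\ell$ (the symbols below are notational choices of mine, not quoted from that paper):
$$\pi_\ell^{*}(\theta_\ell) = \int \pi_\ell^{N}(\theta_\ell \mid y^{*})\, m^{*}(y^{*})\, dy^{*},$$
where $y^*$ is the imaginary training sample, $\pi_\ell^N$ the baseline prior, and $m^*$ the common predictive. As I understand the "power" variant, the likelihood of $y^*$ is additionally raised to a power $1/\delta$ before forming the posterior, so the imaginary data carry less weight in the resulting prior; treat that as a hedged reading rather than the paper's exact definition.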
A Moreau-Yosida Approximation Scheme for High-dimensional Posterior and Quasi-posterior Distributions
Exact-sparsity inducing prior distributions in high-dimensional Bayesian analysis typically lead to posterior distributions that are very challenging to handle by standard Markov Chain Monte Carlo methods. We propose a methodology to derive a smooth approximation of such posterior distributions. The approximation is obtained from the forward-backward approximation of the Moreau-Yosida regulariz...
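The cited construction is paper-specific, but its basic building block, the Moreau-Yosida envelope of a non-smooth term, is easy to illustrate. Below is a minimal sketch (generic, not that paper's posterior approximation; the penalty f(x) = alpha*|x| and the parameter names are illustrative) of the proximal operator and the resulting smooth envelope.

```python
import numpy as np

def prox_abs(x, alpha, lam):
    # Proximal operator of lam * alpha * |.| : soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam * alpha, 0.0)

def moreau_envelope_abs(x, alpha, lam):
    # e_lam f(x) = min_u { alpha*|u| + (u - x)^2 / (2*lam) }, evaluated at the prox.
    u = prox_abs(x, alpha, lam)
    return alpha * np.abs(u) + (u - x) ** 2 / (2.0 * lam)

x = np.linspace(-3, 3, 7)
print(moreau_envelope_abs(x, alpha=1.0, lam=0.5))
# The envelope is a Huber-type smoothing of alpha*|x|: quadratic near zero,
# linear (alpha*|x| - lam*alpha**2/2) in the tails, and differentiable
# everywhere, which is what makes gradient-based MCMC applicable.
```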
Bayesian Estimation of Reliability of the Electronic Components Using Censored Data from Weibull Distribution: Different Prior Distributions
The Weibull distribution has been widely used in survival and engineering reliability analysis. In life testing experiments it is fairly common practice to terminate the experiment before all the items have failed, which means the data are censored. Thus, the main objective of this paper is to estimate the reliability function of the Weibull distribution with uncensored and censored data by using B...
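As a small illustration of how censoring enters the likelihood that such an analysis builds on, here is a hedged sketch (simulated data, made-up parameter values, and a simple maximum-likelihood fit rather than that paper's Bayesian estimators): observed failures contribute the Weibull density, censored items contribute the survival probability, and the reliability function R(t) = exp(-(t/λ)^k) is then evaluated at the fitted values.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
shape_true, scale_true = 1.5, 100.0                 # illustrative "true" values
t = scale_true * rng.weibull(shape_true, size=200)  # failure times
c = rng.uniform(50, 200, size=200)                  # censoring times
time = np.minimum(t, c)
event = (t <= c).astype(float)                      # 1 = failure observed, 0 = censored

def neg_loglik(params):
    k, lam = np.exp(params)                         # enforce positivity
    logpdf = stats.weibull_min.logpdf(time, k, scale=lam)   # observed failures
    logsf = stats.weibull_min.logsf(time, k, scale=lam)     # censored items
    return -np.sum(event * logpdf + (1 - event) * logsf)

res = optimize.minimize(neg_loglik, x0=np.log([1.0, np.mean(time)]))
k_hat, lam_hat = np.exp(res.x)
print("R(100) ≈", np.exp(-(100.0 / lam_hat) ** k_hat))   # estimated reliability at t = 100
```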
Computation for intrinsic variable selection in normal regression models via expected-posterior prior
In this paper we focus on the variable selection problem in normal regression models, using the expected-posterior prior methodology. We provide a straightforward MCMC scheme for the derivation of the posterior distribution, as well as Monte Carlo estimates for the computation of the marginal likelihood and posterior model probabilities. Additionally, for large model spaces, a model search algo...
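As a generic illustration of the last step mentioned there (not that paper's MCMC scheme; the numbers are hypothetical), turning estimated log marginal likelihoods into posterior model probabilities:

```python
import numpy as np

log_ml = np.array([-132.4, -130.1, -131.0])   # hypothetical Monte Carlo estimates for 3 models
log_prior = np.log(np.ones(3) / 3)            # uniform prior over models

log_post = log_ml + log_prior
log_post -= np.logaddexp.reduce(log_post)     # normalize on the log scale for stability
print("posterior model probabilities:", np.exp(log_post))
```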